Neural Architecture Transfer
Authors
Abstract
Neural architecture search (NAS) has emerged as a promising avenue for automatically designing task-specific neural networks. Existing NAS approaches require one complete search for each deployment specification of hardware or objective. This is a computationally impractical endeavor given the potentially large number of application scenarios. In this paper, we propose Neural Architecture Transfer (NAT) to overcome this limitation. NAT is designed to efficiently generate task-specific custom models that are competitive under multiple conflicting objectives. To realize this goal, we learn task-specific supernets from which specialized subnets can be sampled without any additional training. The key to our approach is an integrated online transfer learning and many-objective evolutionary search procedure. A pre-trained supernet is iteratively adapted while simultaneously searching for task-specific subnets. We demonstrate the efficacy of NAT on 11 benchmark image classification tasks, ranging from large-scale multi-class to small-scale fine-grained datasets. In all cases, including ImageNet, NATNets improve upon the state-of-the-art under mobile settings ($\leq$ 600M Multiply-Adds). Surprisingly, small-scale fine-grained datasets benefit the most from NAT. At the same time, the architecture search and transfer is orders of magnitude more efficient than existing NAS methods. Overall, the experimental evaluation indicates that, across diverse image classification tasks and computational objectives, NAT is an appreciably more effective alternative to conventional fine-tuning of the weights of a network architecture learned on standard datasets. Code is available at https://github.com/human-analysis/neural-architecture-transfer
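The loop described in the abstract — adapt a weight-sharing supernet to the target task online while a many-objective evolutionary search selects subnets that are scored with inherited weights, i.e., without any per-candidate training — can be illustrated with a small sketch. Everything below is a toy stand-in (an elastic-width MLP "supernet", random data, and two objectives: validation error and subnet parameter count), not the paper's actual search space, models, or API.

```python
# Minimal sketch of a NAT-style loop: alternate (1) online supernet adaptation
# on the target task and (2) many-objective evolutionary search over subnets
# that are evaluated with inherited weights (no extra training).
# All components here are simplified stand-ins for illustration only.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
random.seed(0)

WIDTHS = [16, 32, 64, 128]          # toy search space: hidden width choices

class ElasticMLP(nn.Module):
    """Weight-sharing 'supernet': a subnet uses the first `width` hidden units."""
    def __init__(self, in_dim=20, max_width=max(WIDTHS), n_classes=4):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, max_width)
        self.fc2 = nn.Linear(max_width, n_classes)

    def forward(self, x, width):
        h = F.relu(self.fc1(x)[:, :width])                       # slice shared weights
        return F.linear(h, self.fc2.weight[:, :width], self.fc2.bias)

def objectives(net, width, x, y):
    """Two conflicting objectives: classification error and subnet size."""
    with torch.no_grad():
        err = (net(x, width).argmax(1) != y).float().mean().item()
    params = width * (x.shape[1] + 1) + net.fc2.out_features * (width + 1)
    return err, params

def pareto_front(pop, scores):
    """Keep candidates not dominated in (error, params); both minimized."""
    front = []
    for i, si in enumerate(scores):
        dominated = any(all(a <= b for a, b in zip(sj, si)) and sj != si
                        for j, sj in enumerate(scores) if j != i)
        if not dominated:
            front.append(pop[i])
    return front

# Toy target-task data (stand-in for a downstream image dataset).
x_train, y_train = torch.randn(512, 20), torch.randint(0, 4, (512,))
x_val, y_val = torch.randn(256, 20), torch.randint(0, 4, (256,))

supernet = ElasticMLP()
opt = torch.optim.Adam(supernet.parameters(), lr=1e-2)
population = random.choices(WIDTHS, k=8)                          # initial subnets

for it in range(20):
    # (1) Online supernet adaptation: train randomly sampled subnets on the task.
    for _ in range(10):
        width = random.choice(WIDTHS)
        idx = torch.randint(0, len(x_train), (64,))
        loss = F.cross_entropy(supernet(x_train[idx], width), y_train[idx])
        opt.zero_grad()
        loss.backward()
        opt.step()

    # (2) Evolutionary-style search: sample new candidate subnets, score them
    #     with inherited supernet weights, and keep the Pareto-efficient set.
    children = [random.choice(WIDTHS) for _ in population]
    candidates = list(set(population + children))
    scores = [objectives(supernet, w, x_val, y_val) for w in candidates]
    population = pareto_front(candidates, scores)

print("Pareto-efficient widths after search:", sorted(set(population)))
```

The essential point the sketch tries to capture is the alternation: step (1) keeps the shared weights useful for the target task across many subnets, so step (2) can compare candidates under conflicting objectives without training any of them individually.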
Similar Resources
Synchronous Transfer Architecture (STA)
This paper presents a novel micro-architecture for high-performance and low-power DSPs. The underlying Synchronous Transfer Architecture (STA) fills the gap between SIMD-DSPs and coarse-grain reconfigurable hardware. STA processors are modeled using a common machine description suitable for both compiler and core generator. The core generator is able to generate models in Lisa, System-C, and VH...
Spiking Neural Network Architecture
ARM microprocessors are found in nearly every consumer device, from smartphones to gameboxes to e-readers and digital televisions. But did you know that, combined, these same ARM microprocessor cores can simulate the human brain? The Spiking Neural Network Architecture (SpiNNaker), a massively parallel neurocomputer architecture, aims to use more than one million ARM microprocessor cores to mod...
Progressive Neural Architecture Search
We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search.... (A toy sketch of this surrogate-guided progressive search appears after this list of similar resources.)
Demystifying Neural Style Transfer
Neural Style Transfer [Gatys et al., 2016] has recently demonstrated very exciting results which catches eyes in both academia and industry. Despite the amazing results, the principle of neural style transfer, especially why the Gram matrices could represent style remains unclear. In this paper, we propose a novel interpretation of neural style transfer by treating it as a domain adaptation pro...
Stereoscopic Neural Style Transfer
This paper presents the first attempt at stereoscopic neural style transfer, which responds to the emerging demand for 3D movies or AR/VR. We start with a careful examination of applying existing monocular style transfer methods to left and right views of stereoscopic images separately. This reveals that the original disparity consistency cannot be well preserved in the final stylization result...
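For the "Progressive Neural Architecture Search" entry above, the key idea is sequential model-based optimization: explore architectures in order of increasing complexity while fitting a cheap surrogate that predicts which candidates deserve expensive evaluation. The sketch below is a toy illustration of that idea only; the block encoding, the evaluate() stand-in for training, the least-squares surrogate, and the beam width are assumptions made for this example, not the paper's actual components.

```python
# Toy sketch of surrogate-guided progressive (SMBO-style) architecture search:
# grow architectures one block at a time and let a cheap learned surrogate
# decide which children are worth evaluating expensively.
import random
import numpy as np

random.seed(0)
OPS = [0, 1, 2]                # toy operation choices per block
MAX_BLOCKS, BEAM = 4, 5        # progressive depth limit and beam width

def evaluate(arch):
    """Stand-in for expensive training + validation of an architecture."""
    return sum(op * (i + 1) for i, op in enumerate(arch)) + random.gauss(0, 0.1)

def encode(arch):
    """Fixed-length one-hot encoding so the surrogate can score any depth."""
    v = np.zeros(MAX_BLOCKS * len(OPS))
    for i, op in enumerate(arch):
        v[i * len(OPS) + op] = 1.0
    return v

history_x, history_y = [], []
beam = [(op,) for op in OPS]                     # start from 1-block architectures

for depth in range(1, MAX_BLOCKS + 1):
    # Evaluate the current beam expensively and record it for the surrogate.
    scored = [(evaluate(a), a) for a in beam]
    history_x += [encode(a) for _, a in scored]
    history_y += [s for s, _ in scored]

    if depth == MAX_BLOCKS:
        break

    # Fit a cheap surrogate (least squares) on everything evaluated so far.
    X, y = np.stack(history_x), np.array(history_y)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Expand every beam member by one block, rank children by predicted score,
    # and keep only the most promising ones for real evaluation.
    children = [a + (op,) for a in beam for op in OPS]
    children.sort(key=lambda a: encode(a) @ w, reverse=True)
    beam = children[:BEAM]

best_score, best_arch = max((evaluate(a), a) for a in beam)
print("best architecture found:", best_arch, "score:", round(best_score, 2))
```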
Journal
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2021
ISSN: 1939-3539, 2160-9292, 0162-8828
DOI: https://doi.org/10.1109/tpami.2021.3052758